AVP/VP - Databricks Architect || Unity Catalog

Impetus Career Consultants

2 - 5 years

Hyderabad

Posted: 28/12/2025

Job Description

Job Title: Databricks Architect

Location: Hyderabad, India

Experience: 12+ Years

Employment Type: Full-Time

Job Summary

We are seeking an experienced Databricks Architect with 12+ years of overall experience to lead the design, architecture, and implementation of scalable data platforms using Databricks. The ideal candidate will have deep expertise in Databricks Lakehouse architecture, Unity Catalog, cloud data engineering, and strong governance and security practices. Databricks certifications are mandatory.

Key Responsibilities

  • Architect, design, and implement end-to-end data solutions using Databricks Lakehouse on cloud platforms (Azure/AWS/GCP).
  • Define and drive enterprise-scale architectures for data ingestion, processing, analytics, and machine learning workloads.
  • Lead implementation and governance using Unity Catalog, including data access controls, lineage, auditing, and metadata management.
  • Establish best practices for data modeling, performance tuning, cost optimization, and CI/CD for Databricks workloads.
  • Collaborate with data engineering, analytics, ML, security, and platform teams to deliver robust and compliant solutions.
  • Provide technical leadership, mentorship, and architectural guidance to development teams.
  • Ensure adherence to data security, privacy, and compliance standards.
  • Evaluate and integrate new Databricks features and ecosystem tools.
  • Act as a key stakeholder in solution reviews, design approvals, and architectural governance forums.

Required Skills & Experience

  • 12+ years of overall IT experience with a strong focus on data engineering and architecture.
  • 5+ years of hands-on experience with Databricks in production environments.
  • Deep expertise in Databricks Lakehouse, Delta Lake, Unity Catalog, and workspace administration.
  • Strong hands-on experience with Apache Spark (PySpark/Scala).
  • Experience with cloud platforms: Azure (preferred), AWS, or GCP.
  • Solid understanding of data governance, security, IAM, RBAC/ABAC, and regulatory compliance.
  • Experience designing batch and streaming pipelines (e.g., Auto Loader, Structured Streaming).
  • Strong SQL skills and experience with data warehousing and analytics workloads.
  • Proven ability to architect scalable, high-performance, and cost-efficient data solutions.

Certifications (Mandatory)

  • Databricks Certified Data Engineer Professional and/or
  • Databricks Certified Solutions Architect (preferred)
  • Cloud certifications (Azure/AWS/GCP) are a plus
